
Enhance Device and Precision Handling, and Improve Error Messages in DepthPro Model #35

Open
wants to merge 2 commits into main

Conversation

Mefisto04

Pull Request Description

This pull request introduces several improvements to the DepthPro model code:

  1. Device and Precision Handling: The model now dynamically selects the appropriate device (cuda or cpu) based on availability. Additionally, half-precision (torch.half) support has been added to improve performance on compatible devices.

  2. Improved Error Messages: Enhanced error messages for loading model state dictionaries provide clearer feedback on any issues that arise during the loading process.

These enhancements aim to improve the usability and performance of the DepthPro model, making it more efficient and user-friendly.

@Mefisto04
Author

Hey @Amael, please review this.


@carlos-bg left a comment


There may be an issue with using device in the proposed fashion in the function create_model_and_transforms:

-    device: torch.device = torch.device("cpu"),
-    precision: torch.dtype = torch.float32,
+    device: torch.device = torch.device("cuda" if torch.cuda.is_available() else "cpu"),
+    precision: torch.dtype = torch.float16 if device.type == 'cuda' else torch.float32,


This use of device at that point doesn't work for me: a default parameter value cannot reference another parameter, since defaults are evaluated at function definition time, before any arguments are bound.
I have to do precision = torch.float16 if device.type == 'cuda' else torch.float32 in the body of the function instead.
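A minimal sketch of that workaround (a hypothetical signature mirroring create_model_and_transforms, not the actual DepthPro code): use None sentinels as defaults and resolve both values inside the function body.

```python
from typing import Optional, Tuple

import torch


def create_model_and_transforms(
    device: Optional[torch.device] = None,
    precision: Optional[torch.dtype] = None,
) -> Tuple[torch.device, torch.dtype]:
    # Resolve device and precision inside the body: a default parameter
    # value cannot reference another parameter, because defaults are
    # evaluated once, at function definition time.
    if device is None:
        device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
    if precision is None:
        precision = torch.float16 if device.type == "cuda" else torch.float32
    return device, precision
```

With this pattern, an explicit device still works as before, and the precision follows whichever device was actually chosen.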


@carlos-bg, Oct 22, 2024


When running depth-pro-run -i PATH_TO_MY_IMG_FILE, I also hit a problem:

Traceback (most recent call last):
  File "miniconda3/envs/depth-pro/bin/depth-pro-run", line 8, in <module>
    sys.exit(run_main())
  File "ml-depth-pro/src/depth_pro/cli/run.py", line 150, in main
    run(parser.parse_args())
  File "ml-depth-pro/src/depth_pro/cli/run.py", line 68, in run
    prediction = model.infer(transform(image), f_px=f_px)
  File "miniconda3/envs/depth-pro/lib/python3.9/site-packages/torchvision/transforms/transforms.py", line 95, in __call__
    img = t(img)
  File "miniconda3/envs/depth-pro/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "miniconda3/envs/depth-pro/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
  File "miniconda3/envs/depth-pro/lib/python3.9/site-packages/torchvision/transforms/transforms.py", line 198, in forward
    return F.convert_image_dtype(image, self.dtype)
  File "miniconda3/envs/depth-pro/lib/python3.9/site-packages/torchvision/transforms/functional.py", line 243, in convert_image_dtype
    return F_t.convert_image_dtype(image, dtype)
  File "miniconda3/envs/depth-pro/lib/python3.9/site-packages/torchvision/transforms/_functional_tensor.py", line 73, in convert_image_dtype
    if torch.tensor(0, dtype=dtype).is_floating_point():
TypeError: tensor(): argument 'dtype' must be torch.dtype, not tuple
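One possible cause of this TypeError (an assumption about the bug, not confirmed from the PR diff): a stray trailing comma after the precision assignment would turn the dtype into a one-element tuple, which torchvision then rejects.

```python
import torch

# Hypothetical reproduction: the trailing comma makes `precision` a
# 1-tuple (torch.float32,) rather than a torch.dtype.
precision = torch.float16 if torch.cuda.is_available() else torch.float32,  # note the comma

print(type(precision))  # <class 'tuple'>

# Passing that tuple where a torch.dtype is expected raises the same
# TypeError that convert_image_dtype reports in the traceback.
try:
    torch.tensor(0, dtype=precision)
except TypeError as exc:
    print(exc)
```

Dropping the trailing comma (or checking isinstance(precision, torch.dtype) before building the transforms) would rule this out.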

@Mefisto04
Author

@carlos-bg, I have made some changes; please review.

@Mefisto04
Copy link
Author

@carlos-bg, please review.
